Coordinate Descent with Coupled Constraints

Author

  • Aman Sinha

Abstract

For many big data applications, a relatively small parameter vector θ ∈ R^n is determined by fitting a model to a very large dataset with N observations. We consider a different motivating problem in which both n and N are large. In this regime, batch optimization techniques, as well as many stochastic techniques that must work with the entire θ vector at each step (e.g. mirror descent methods), are too inefficient. Coordinate descent methods work with only a single coordinate of θ at each iteration, but standard coordinate descent methods typically assume that the constraint sets for different coordinates are independent. We perform coordinate descent in the presence of coupled constraints. Specifically, we propose a simple algorithm that works with a group of k coordinates at each iteration; the minimum sufficient value of k depends on the constraint set. We prove convergence of the algorithm for polyhedral constraint sets and show experimental results on the probability simplex.
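To illustrate the idea, here is a minimal sketch (not the paper's exact algorithm): on the probability simplex, the sum constraint couples every coordinate, so k = 2 is the smallest group that admits a feasible move. Shifting mass between two coordinates along e_i − e_j preserves the sum, and clipping the step keeps the iterate nonnegative. The objective, step rule, and helper names below are assumptions made for the sketch.

```python
import numpy as np

def pairwise_cd_simplex(grad, x0, n_iters=10000, rng=None):
    """Sketch of group coordinate descent on the probability simplex.

    grad(x, idx) -> partial derivatives of a smooth objective f at x for the
    coordinates in idx. With k = 2 coordinates per iteration, the update
    direction e_i - e_j keeps sum(x) = 1, and the step is clipped so that
    both coordinates stay nonnegative.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    n = x.size
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        g_i, g_j = grad(x, (i, j))
        t = -0.5 * (g_i - g_j)       # 1/L-style step along d = e_i - e_j (L = 1 assumed)
        t = np.clip(t, -x[i], x[j])  # keep x[i] >= 0 and x[j] >= 0
        x[i] += t
        x[j] -= t
    return x

# Example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex (c is arbitrary).
c = np.array([0.1, 0.5, 0.2, 0.9])
grad = lambda x, idx: tuple(x[k] - c[k] for k in idx)
x_star = pairwise_cd_simplex(grad, np.full(4, 0.25))
print(x_star, x_star.sum())
```

Each iteration touches only two coordinates and two partial derivatives, which is the point of working with small coordinate groups when n is large.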


Similar articles

Large-scale randomized-coordinate descent methods with non-separable linear constraints

We develop randomized block coordinate descent (CD) methods for linearly constrained convex optimization. Unlike other large-scale CD methods, we do not assume the constraints to be separable, but allow them to be coupled linearly. To our knowledge, ours is the first CD method that allows linear coupling constraints without making the global iteration complexity have an exponential dependence on ...


A Random Coordinate Descent Method on Large-scale Optimization Problems with Linear Constraints

In this paper we develop a random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints and prove that it obtains in expectation an ε-accurate solution in at most O(1/ε) iterations. However, the numerical complexity per iteration of the new method is usually much cheaper than that of methods based on full gradient information. We focus on...
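To make the coupling concrete, the following small sketch (an illustration, not the method from this paper) shows the per-iteration work for a single linear constraint a·x = b: two randomly chosen coordinates are moved along a direction in the null space of a, so feasibility is preserved exactly and only two partial derivatives are evaluated. This is why each iteration can be far cheaper than a full-gradient step. The function grad_pair and the fixed step size are assumptions of the sketch.

```python
import numpy as np

def coupled_pair_step(x, grad_pair, a, i, j, step=0.1):
    """One randomized two-coordinate step that preserves the coupling a @ x == b.

    grad_pair(x, i, j) -> (df/dx_i, df/dx_j) for a smooth objective f (assumed).
    The direction d = a[j] * e_i - a[i] * e_j satisfies a @ d == 0, so the
    linear constraint remains satisfied exactly after the update.
    """
    g_i, g_j = grad_pair(x, i, j)
    slope = a[j] * g_i - a[i] * g_j   # directional derivative of f along d
    t = -step * slope                 # fixed-step descent along d
    x = x.copy()
    x[i] += t * a[j]
    x[j] -= t * a[i]
    return x

# Example: f(x) = 0.5 * ||x||^2 subject to a @ x == 1; draw a random pair each step.
rng = np.random.default_rng(0)
a = np.array([1.0, 2.0, 3.0])
x = np.array([1.0, 0.0, 0.0])        # feasible start: a @ x == 1
for _ in range(200):
    i, j = rng.choice(3, size=2, replace=False)
    x = coupled_pair_step(x, lambda x, i, j: (x[i], x[j]), a, i, j)
print(x, a @ x)                       # a @ x stays equal to 1 throughout
```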


A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints

In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has Lipschitz continuous gradient, then we prove that our method obtains an ε-optimal solution in O(N/ε) iterations, where N is the number of blocks. For the class of problems wit...


Coordinate Descent Algorithms With Coupling Constraints: Lessons Learned

Coordinate descent methods are enjoying renewed interest due to their simplicity and success in many machine learning applications. Given recent theoretical results on random coordinate descent with linear coupling constraints, we develop a software architecture for this class of algorithms. A software architecture has to (1) maintain solution feasibility, (2) be applicable to different executi...


A Random Coordinate Descent Algorithm for Singly Linear Constrained Smooth Optimization

In this paper we develop a novel randomized block-coordinate descent method for minimizing multi-agent convex optimization problems with singly linear coupled constraints over networks and prove that it obtains in expectation an ε-accurate solution in at most O(1/(λ2(Q) ε)) iterations, where λ2(Q) is the second smallest eigenvalue of a matrix Q that is defined in terms of the probabilities and t...
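The matrix Q is not specified in this excerpt; purely as an illustration, if Q behaves like a graph Laplacian built from the network and the sampling probabilities (an assumption), then λ2(Q) is the algebraic connectivity and can be computed directly, which makes the O(1/(λ2(Q) ε)) bound easy to read: sparsely connected networks have small λ2 and require more iterations.

```python
import numpy as np

def second_smallest_eigenvalue(Q):
    """Return lambda_2(Q), the second smallest eigenvalue of a symmetric matrix Q."""
    eigvals = np.linalg.eigvalsh(Q)   # eigenvalues in ascending order
    return eigvals[1]

# Hypothetical Q: Laplacian of a path graph on 5 nodes (uniform edge weights).
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # path adjacency matrix
Q = np.diag(A.sum(axis=1)) - A                         # graph Laplacian
print(second_smallest_eigenvalue(Q))   # small lambda_2 -> larger iteration bound
```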



Publication year: 2015